
Conversation

@usualoma
Member

fixes #240

With this PR, you can apply the optimisation that accesses internal data on Node.js v24 by starting the application as follows.

$ node --import @hono/node-server/setup ./app.mjs

What is this?

As pointed out by @timokoessler in #240, in Node.js v24.x, due to changes in nodejs/undici@45dceea, it is no longer possible to retrieve internal data using the previous node-server method.

Even when internal data cannot be retrieved, node-server continues to function normally; however, this may result in performance degradation for certain applications.
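
For context, here is a rough illustration of the kind of lookup that used to work and now fails on v24. This is a simplified sketch, not node-server's actual code, and it assumes the older undici layout where the internal state was an own property keyed by a Symbol whose description contains 'state':

// Simplified sketch (assumption: older undici exposed internal state via a symbol key)
const res = new Response('Response Text')
const stateKey = Object
  .getOwnPropertySymbols(res)
  .find((s) => s.description?.includes('state'))

if (stateKey) {
  // Worked on earlier Node.js versions: read the body source directly
  console.log(res[stateKey].body.source)
} else {
  // On Node.js v24 the key is no longer exposed, so node-server falls back
  // to the standard (slower) Response API
  console.log('internal state key not found')
}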

Another way to get internal data

This change to undici makes the internal data harder to obtain than before, but it is still possible. If you run the following code snippet on v24, you will see that it can be retrieved.

// Intercept Reflect.deleteProperty so we can grab undici's getResponseState
// helper just before undici deletes it
let getResponseState = null
const originalDeleteProperty = Reflect.deleteProperty
Reflect.deleteProperty = (target, prop) => {
  if (prop === 'getResponseState') {
    // Save the accessor, then restore the original Reflect.deleteProperty
    getResponseState = target.getResponseState
    Reflect.deleteProperty = originalDeleteProperty
  }
  return originalDeleteProperty(target, prop)
}

// Referencing Response for the first time triggers its lazy initialisation,
// during which the hook above captures getResponseState
const res = new Response('Response Text')
console.log(getResponseState(res).body.source) // => 'Response Text'

The difficulty of making this code work

In Node.js, the Response class is initialised lazily, the first time global.Response is referenced.
This means the code snippet above must run before global.Response is referenced for the first time. However, when executing .mts or .mjs files, global.Response is initialised during import resolution in most applications, so simply placing the code inside node-server will not work as expected.
To ensure that it works correctly, users must explicitly specify the --import option at runtime.

$ node --import @hono/node-server/setup ./app.mjs
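
For reference, the preloaded module only needs to install a hook like the snippet above before the application's own imports are evaluated. The following is a minimal sketch of such a setup module; it illustrates the mechanism only, is not the actual implementation of @hono/node-server/setup, and the __getResponseState global is made up for the example.

// setup.mjs (hypothetical sketch)
// `node --import ./setup.mjs ./app.mjs` evaluates this module before ./app.mjs,
// i.e. before the application has referenced global.Response for the first time
const originalDeleteProperty = Reflect.deleteProperty
Reflect.deleteProperty = (target, prop) => {
  if (prop === 'getResponseState') {
    // Hand the captured accessor to the server code via a global, then restore Reflect
    globalThis.__getResponseState = target.getResponseState
    Reflect.deleteProperty = originalDeleteProperty
  }
  return originalDeleteProperty(target, prop)
}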

The option of giving up this optimisation

This optimisation was introduced in #145, but even without it, the speed in the 'best case' does not decrease. So one option is to avoid the complexity of this PR and instead drop the optimisation that speeds things up by retrieving internal data directly.

(However, any code that modifies the response headers falls outside the 'best case' scenario and remains a target for this optimisation, so I believe it applies to a fairly large number of applications.)

@usualoma
Member Author

Hi @yusukebe,

I believe that the only way to resolve this issue is to proceed as follows.

$ node --import @hono/node-server/setup ./app.mjs

Do you think it is necessary to optimise in this complex manner?

The --import option is unnecessary for those who do not need the optimisation, so it may be best to leave the default behaviour as it is and let only those who want the optimisation enable it.

@yusukebe
Member

@usualoma

Sorry for being late. I'm a little busy now and will comment on it later. Thanks.

@yusukebe
Member

Hi @usualoma !

I am taking several benchmarks. I want to confirm in which cases there is a difference between using --import @hono/node-server/setup and not using it. Is there a difference in the Hello World app?

@usualoma
Member Author

Hi @yusukebe, Thank you for your comment.

We can see the difference in the following application. (The key point is res.headers.set('x-foo', 'bar').)

import { serve } from '@hono/node-server'
import { Hono } from 'hono'

const app = new Hono()

app.get('/', () => {
    const res = new Response('foo')
    res.headers.set('x-foo', 'bar')
    return res
})

serve({
    fetch: app.fetch,
    port: 3003,
})

with --import @hono/node-server/setup

$ docker run -v `pwd`:/app -w /app/hono -p 3003:3003 --rm -it node:24 node --import @hono/node-server/setup ./app.mjs
$ bombardier -d 10s --fasthttp http://localhost:3003/
Bombarding http://localhost:3003/ for 10s using 125 connection(s)
[====================================================================================] 10s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec     48408.15    5503.67   57826.03
  Latency        2.58ms     1.42ms   138.00ms
  HTTP codes:
    1xx - 0, 2xx - 483930, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    11.03MB/s

without --import @hono/node-server/setup

$ docker run -v `pwd`:/app -w /app/hono -p 3003:3003 --rm -it node:24 node ./app.mjs
$ bombardier -d 10s --fasthttp http://localhost:3003/
Bombarding http://localhost:3003/ for 10s using 125 connection(s)
[====================================================================================] 10s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec     42615.06    6739.75   52799.27
  Latency        2.93ms     3.17ms   332.86ms
  HTTP codes:
    1xx - 0, 2xx - 426148, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:     9.71MB/s

@yusukebe
Member

yusukebe commented May 16, 2025

Hi @usualoma

Thank you for the answer! This is a little bit difficult to decide. But it's okay to merge this PR!

I also took a benchmark. I can't say it too loudly, but... I believe the most important aspect of benchmarking is comparison with other frameworks (not comparing Hono against itself). In practical use, the difference may not be significant because the bottleneck will be elsewhere, but when compared with other frameworks, it becomes important.

In the current Hono implementation, creating a Response instance and then setting headers on it is not a common case; we can use c.header() to avoid it. In the benchmark case, it always uses c.header().

Also, if users have problems in the real world, I think it's a good idea to use --import.

So, this PR is good. Can you make this PR ready and ping me?


The following is just for reference. I created apps that add a header using Hono, Express, and Fastify, and benchmarked them.

// Hono
app.get('/', (c) => {
  const res = new Response('hi')
  res.headers.set('x-foo', 'bar')
  return res
})
// Express
import express from 'express'

const app = express()
const port = 3001

app.get('/', (req, res) => {
  res.set('x-foo', 'bar')
  res.send('H!')
})

app.listen(port)
// Fastify
import Fastify from 'fastify'

const fastify = Fastify({ logger: false })

fastify.get('/', async (request, reply) => {
  reply.header('x-foo', 'bar')
  return 'H!'
})

fastify.listen({ port: 3002 })

The result is as follows. If you don't include --import, it will lose to Express.

[Screenshot: benchmark results comparing the Hono, Express, and Fastify apps above]

The following is an example using c.header() and c.text().

app.get('/', (c) => {
  c.header('x-foo', 'bar')
  return c.text('H!')
})

[Screenshot: benchmark result for the c.header() / c.text() version]

As expected, using c.header() and c.text() is fast.

@usualoma
Member Author

Hi @yusukebe, Thank you for reviewing and confirming the performance!

Is it common for headers to be modified?

As @yusukebe mentioned, returning c.text('H!') from the handler avoids touching the headers in many cases. However, when middleware is used (from the perspective of Hono and the application, not node-server), rewriting headers is often the reasonable approach. Even among the middleware bundled with Hono, secure-headers and etag adopt an approach that rewrites headers, as the sketch below illustrates.
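
For illustration, a typical header-rewriting middleware looks like this; a minimal sketch in the spirit of secure-headers, not the actual middleware code:

import { Hono } from 'hono'

const app = new Hono()

// After the downstream handler has produced its Response, the middleware
// mutates the headers of the final response, so the response no longer hits
// the "best case" fast path even if the handler itself used c.text()
app.use('*', async (c, next) => {
  await next()
  c.res.headers.set('x-frame-options', 'DENY')
})

app.get('/', (c) => c.text('H!'))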

Headers and status codes are frequently referenced.

Middleware probably mainly references (and modifies) headers and status. For these two properties, it seems possible to prevent performance degradation (creation of globalThis.Response objects) when referenced, as shown in the following PR.

usualoma#3
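
To make the idea concrete, here is a hypothetical sketch of the general technique, not the actual code in usualoma#3: keep status and headers in plain fields and only materialise a real globalThis.Response lazily when something else is needed.

// Hypothetical sketch: expose status/headers without building a globalThis.Response
const GlobalResponse = globalThis.Response

class LightweightResponse {
  #body
  #response // real globalThis.Response, created only on demand

  constructor(body, init = {}) {
    this.#body = body
    this.status = init.status ?? 200
    this.headers = new Headers(init.headers)
  }

  // Middleware can read and modify status/headers without triggering this
  #materialise() {
    this.#response ??= new GlobalResponse(this.#body, {
      status: this.status,
      headers: this.headers,
    })
    return this.#response
  }

  arrayBuffer() {
    return this.#materialise().arrayBuffer()
  }
}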

It may not be necessary to refer to the internal data structure.

Even if headers and status are referenced, as long as we can still return a fast response, I feel that hacks such as getInternalBody are unnecessary.

#242

Current thoughts

I opened this PR to apply the same approach as getInternalBody on v24, but in order to keep behaviour stable in future versions and to achieve good performance without requiring --import, I believe it would be best to adopt #242.

@yusukebe I'm sorry to present another proposal. What do you think?

@yusukebe
Member

Hi @usualoma !

#242 is great! The implementation of #242 seems simpler than the current one. I thought it could also cache the url and statusText properties, but they are not referenced often, and their values change depending on the implementation of Request. Shall we proceed with #242?

@usualoma
Member Author

Hi @yusukebe
Thanks for checking! Let's close this PR and move on to #242.

@usualoma usualoma closed this May 19, 2025